There's not much time, so I'll just be writing down brief notes.
The notation for NAND and NOR can be thought of as \(\land, \lor\) with vertical bars through the middle of them - i.e. \(\uparrow, \downarrow\).
The notations for XNOR and XOR are \(\equiv\) and \(\not \equiv\) respectively (think about the truth tables).
I use \(\implies\), the double arrow; the exam uses \(\rightarrow\), the single arrow. Please bear this in mind.
Top \(\top\) and bottom \(\bot\) are True and False respectively.
The degree of a parse tree is the number of inner nodes.
A valuation is "a mapping of a formula to true or false" i.e. what it resolves to. Always true: tautology. Always false: contradiction. Sometimes true: satisfiable.
X is a consequence of a set S, written \(S \models X\), if every formula in S is true \(\implies\) X is true. A tautology is a consequence of nothing: \(\models X\).
Write a conjunction \(X_1 \land X_2 \land \cdots\) as an angle list \(\langle X_1, X_2, \cdots \rangle\) and a disjunction \(Y_1 \lor Y_2 \lor \cdots\) as a square list \([Y_1, Y_2, \cdots]\). X can be a complex formula, or a literal (\(\top, \bot, x, \lnot x\)). A disjunction of literals is a clause and its conjunction counterpart is a dual clause.
The conjunctive normal form is a conjunction of disjunctions \(\langle [] \rangle\) of literals. And is associated with alpha \(\alpha\).
The disjunctive normal form is a disjunction of conjunctions \([ \langle \rangle ] \) of literals. Or is associated with beta \(\beta\).
Every binary formula under the sun (except XOR and XNOR) can be reduced to \(A \land B\) or \(A \lor B\), where \(A, B\) can be literals or other formulas, via a combination of simplifying formulas like \(\implies\) and De Morgan's laws.
Note \(A \nRightarrow A \equiv \lnot (A \Rightarrow A) \equiv \lnot (\lnot A \lor A) \equiv A \land \lnot A.\)
If a formula simplifies to \(A \land B\), it is an alpha formula. If it simplifies to \(A \lor B\), it is a beta formula.
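For example, \(X \implies Y \equiv \lnot X \lor Y\) is a beta formula with \(\beta_1 = \lnot X, \beta_2 = Y\), while its negation \(\lnot (X \implies Y) \equiv X \land \lnot Y\) is an alpha formula with \(\alpha_1 = X, \alpha_2 = \lnot Y\).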
From a truth table, look at rows with False as output. Write the "inverse or", i.e. if a row is \(x=T, y=T, z=F \longrightarrow F\), then the clause for that row is \([\lnot x, \lnot y, z] \). Put all of these clauses together in a conjunction.
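Ex. for \(x \implies y\), the only row with output F is \(x=T, y=F\), giving the clause \([\lnot x, y]\) and hence the CNF \(\langle [\lnot x, y] \rangle\).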
For a formula \(X\), start by writing \(\langle [X] \rangle\), then reduce via algorithm.
In short, the algorithm is represented as:
\[
\begin{array}{|c|c|c|c|c|c|}
\hline
\lnot \top & \lnot \bot & \lnot \lnot Z & \beta & \alpha & \\ \hline
\bot & \top & Z & \beta_1 & \alpha_1 & \alpha_2 \\
 & & & \beta_2 & & \\ \hline
\end{array}
\]
Where \(\beta\) is a beta formula (\(\beta_1 \lor \beta_2\)), and \(\alpha\) is an alpha formula (\(\alpha_1 \land \alpha_2\)).
In full: repeatedly rewrite using the table (a \(\beta\) formula inside a clause is replaced by \(\beta_1, \beta_2\) in the same clause, while an \(\alpha\) formula splits the clause into two copies, one containing \(\alpha_1\) and the other \(\alpha_2\)) until all items are literals.
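A small worked example, for \(X = p \implies (q \land r)\): \[ \langle [p \implies (q \land r)] \rangle \to \langle [\lnot p, q \land r] \rangle \to \langle [\lnot p, q], [\lnot p, r] \rangle \] (the \(\beta\) formula \(p \implies (q \land r)\) is replaced by \(\beta_1, \beta_2\) inside the clause, then the \(\alpha\) formula \(q \land r\) splits the clause), i.e. \((\lnot p \lor q) \land (\lnot p \lor r)\).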
From a truth table, look at the rows with True as output. Write the and of the literals, i.e. if a row is \(x=T, y=F, z=T \longrightarrow T\), write the dual clause \(\langle x, \lnot y, z \rangle\). Put all of these together in a disjunction.
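Ex. for \(x \implies y\), the rows with output T are \((T,T), (F,T), (F,F)\), giving the DNF \([\langle x, y \rangle, \langle \lnot x, y \rangle, \langle \lnot x, \lnot y \rangle]\).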
For a formula \(X\), start with \([\langle X \rangle]\). The algorithm is identical, except the roles of the \(\alpha\) and \(\beta\) are swapped. \[ \begin{array}{|c|c|c|c|c|c|} \hline \lnot \top & \lnot \bot & \lnot \lnot Z & \beta & & \alpha \\ \hline \bot & \top & Z & \beta_1 & \beta_2 & \alpha_1 \\ & & & & & \alpha_2 \\ \hline \end{array} \]
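The same worked example for DNF: \[ [\langle p \implies (q \land r) \rangle] \to [\langle \lnot p \rangle, \langle q \land r \rangle] \to [\langle \lnot p \rangle, \langle q, r \rangle] \] (here the \(\beta\) formula splits the dual clause, while the \(\alpha\) formula stays inside one dual clause), i.e. \(\lnot p \lor (q \land r)\).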
Two methods: the semantic tableau (tree), which is ~ DNF, and resolution, which is ~ CNF.
Both prove that a formula is a tautology, by contradiction.
Tree form. Branches (from root to tip) are conjunctions (ands), and the whole tree's "value" is a disjunction (ors) of branches. We prove by expanding branches. A node can be expanded multiple times; in a strict tableau each node is expanded only once, which is still sufficient.
To prove a formula \(X\), start with the antiformula \(\lnot X\) as the root of the tree. Now look at the DNF table.
When all branches are closed, X is proved to be a tautology. Denote a tableau proof as \(\vdash_t X\).
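A small example, proving \(\vdash_t p \implies (q \implies p)\):

1. \(\lnot (p \implies (q \implies p))\) (the antiformula, at the root; an \(\alpha\) formula)
2. \(p\) (from 1)
3. \(\lnot (q \implies p)\) (from 1; also \(\alpha\))
4. \(q\) (from 3)
5. \(\lnot p\) (from 3)

The single branch contains both \(p\) and \(\lnot p\), so it closes, and \(\vdash_t p \implies (q \implies p)\).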
The dual to the tableau; this one is just written as a numbered list of disjunctions. Similarly, it proves X is a tautology by contradiction.
Make line 1 \([\lnot X]\). Then look at the CNF table.
When the empty clause \([]\) is found, the resolution proof is complete. Denote this as \(\vdash_r X\). A line can be "expanded" multiple times, and we also have the idea of "strictness" in resolution.
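The same example by resolution, \(\vdash_r p \implies (q \implies p)\):

1. \([\lnot (p \implies (q \implies p))]\)
2. \([p]\) (from 1, \(\alpha\))
3. \([\lnot (q \implies p)]\) (from 1, \(\alpha\))
4. \([q]\) (from 3, \(\alpha\))
5. \([\lnot p]\) (from 3, \(\alpha\))
6. \([]\) (resolving 2 and 5 on \(p\))

Since \([]\) was found, the proof is complete.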
To prove \(S \models X\) (for a set S of formulas), add an S-introduction rule to both methods.
Denote these proofs as \(S \vdash_t X\) and \(S \vdash_r X\).
Note: all boxes here are done horizontally for easy formatting. They should be vertical in reality.
Two techniques to prove a formula: start with the antiformula, or work backwards (e.g. for \(x \implies y\) we must assume \(x\) and logically conclude \(y\)).
In a proof, only active formulas (those not in closed boxes) can be used.
Assumptions start a box: \([x \cdots\). Premises are formulas S-introduced when proving consequence, and do not start a box.
A double arrow is used for implies; a single arrow is used for "produces".
Write \(S \vdash_d X\) if "S entails X" has a deduction proof.
First Order Logic contains the existing propositional logic connectives, as well as the quantifiers \(\forall\) (for all) and \(\exists\) (there exists), together with variables, constants, function symbols, and relation (predicate) symbols.
Define it as \(L(R,F,C)\) where R is a finite/countable set of relation/predicate symbols, F a set of functions, and C a set of constant symbols.
Define Terms: any constant symbol or variable is a term, and \(f(t_1, \dots, t_n)\) is a term for any function symbol \(f\) and terms \(t_1, \dots, t_n\).
Define Atomic formulas as any relation \(R(t_1, \dots, t_n)\) where each \(t\) is a term; or \(\bot, \top\).
Thus define Formulas: any atomic formula is a formula; and if \(\Phi, \Psi\) are formulas then so are \(\lnot \Phi\), \(\Phi \circ \Psi\) for any binary connective \(\circ\), \(\forall x \, \Phi\), and \(\exists x \, \Phi\).
A bound variable is a variable that appears within the scope of a quantifier over it, whilst a free variable does not.
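For example, in \(\forall x\, R(x, y)\) the variable \(x\) is bound and \(y\) is free.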
A model for \(L(R,F,C)\) is a pair \(M=(D,I)\), where \(D\) is a non-empty domain and \(I\) is an interpretation mapping each constant symbol to an element of \(D\), each \(n\)-ary function symbol to a function \(D^n \to D\), and each \(n\)-ary relation symbol to an \(n\)-ary relation on \(D\).
The truth value of a formula \(\Phi\) under \(I\) and an assignment \(A\) (a mapping of variables to elements of \(D\)) is written \(\Phi^{I,A}\); whilst it has a lot of formal definition, it is literally just how you work out truth things normally.
A formula is valid if it is true for every possible assignment of variables, and satisfiable if there is at least one assignment making it true.
For FoL, we introduce gamma and delta formulas for \(\forall\) and \(\exists\) respectively:

| \(\gamma\) | \(\gamma(t)\) |
|---|---|
| \(\forall x, \Phi\) | \(\Phi\{x / t\}\) |
| \(\lnot \exists x : \lnot \Phi\) | \(\lnot \Phi \{x / t\}\) |

| \(\delta\) | \(\delta(t)\) |
|---|---|
| \(\exists x : \Phi\) | \(\Phi \{x/t\}\) |
| \(\lnot \forall x, \lnot \Phi\) | \(\lnot \Phi \{ x / t \}\) |
Where \(x / t\) means free occurrences of the variable \(x\) are replaced by a term \(t\) (which may be represented by a generic placeholder name).
Define par as a set of parameters disjoint from C in \(L(R,F,C)\), and denote \(L(R,F,C \cup par)\) by \(L^{par}\)
The tableau rules are
| \(\gamma\) | \(\delta\) |
|---|---|
| \(\gamma(t)\) | \(\delta(p)\) |
Where \(t\) is any closed term (no variables), and \(p\) is any new parameter.
For example, \(\forall x (R(x) \implies Q(x,y))\) is gamma-replaced by \(R(t) \implies Q(t,y)\).
Note: parameters can be used as closed terms after they are defined.
Also, gamma and delta expansions may need to be used more than once; it may be absolutely necessary to repeat them with different replacement terms.
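A small worked example (taking \(P\) to be some unary relation symbol), proving \(\vdash_t \exists x\, \lnot P(x) \implies \lnot \forall x\, P(x)\):

1. \(\lnot (\exists x\, \lnot P(x) \implies \lnot \forall x\, P(x))\) (antiformula; \(\alpha\))
2. \(\exists x\, \lnot P(x)\) (from 1)
3. \(\lnot \lnot \forall x\, P(x)\) (from 1)
4. \(\forall x\, P(x)\) (from 3)
5. \(\lnot P(p)\) (from 2, \(\delta\), with new parameter \(p\))
6. \(P(p)\) (from 4, \(\gamma\), using \(p\) as a closed term)

The branch contains \(P(p)\) and \(\lnot P(p)\), so it closes.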
The rules are identical for resolution.
Note that for both of these, S-introduction also remains the same.
The rule is now called \(\gamma\delta\)-elimination, and is... pretty much the same.
There is also \(\gamma\delta \)-introduction:
We want to verify properties about programs.\(\renewcommand{\phi}{\varphi} \) Let \(p\) be a program, and \(\phi\) be a property. If we can prove this property, we write \(p \vdash \phi\).
Syntax falls under domains (i.e. types): E for integer expressions, B for boolean expressions, and C for command expressions. "Expressions" means you can combine stuff; it's not atomic.
Our given programming language is defined with the following syntax:

- x = E (for an E expression)
- C1; C2
- if B then { C1 } (else { C2 })
- while B { C }
To formally prove certain conditions hold about programs, we define Hoare triples.
Hoare triples are triples containing a precondition, a program, and a postcondition. They are written \[ ⦇ Pre ⦈ \; Program \; ⦇ Post ⦈ \]
Hoare triples can be valid: ⦇y = 1⦈ x = y + 1 ⦇x = 2⦈
or invalid: ⦇y = 1⦈ x = y + 1 ⦇x < 1⦈
The process of proving is basically finding intermediate conditions between the lines of the program, which make each line of the program into a valid Hoare triple.
Say we have ⦇?⦈ x = x + 3 ⦇x > 3⦈. We can find many different preconditions that satisfy this triple, but for the best effect we want to find the most general condition: the weakest precondition.
The Weakest Precondition is a valid precondition that is implied by all other valid preconditions. Given program \(P\) and postcondition \(Post\), we denote the weakest precondition as \(wp(P, Post)\).
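For the ⦇?⦈ example above, \(x > 0\), \(x > 5\) and \(x = 7\) are all valid preconditions; each of them implies \(x > 0\), so \(x > 0\) is the weakest: \(wp(\)x = x + 3\(, x > 3) = (x > 0)\).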
We calculate it with the following rules:
\(wp(\)x = E\(, Post) = Post[x/E]\) (using the same "replaces" notation as with FoL).

\(wp(\)P; Q\(, Post) = wp(P, wp(Q, Post))\)

\(wp(\)if B then { C1 } else { C2 }\(, Post)\)
\begin{align}
= \;&(B \implies wp(C_1, Post)) \land (\lnot B \implies wp(C_2, Post)) \\
\textrm{or } &(B \land wp(C_1, Post)) \lor (\lnot B \land wp(C_2, Post))
\end{align}
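Ex. \(wp(\)x = x + 1; y = x + x\(, y > 4) = wp(\)x = x + 1\(, x + x > 4) = ((x + 1) + (x + 1) > 4) \equiv (x > 1)\).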
A program is a series of instructions: \(C_1, C_2, \dots, C_n\).
Proofs are laid out (vertically) as \[ ⦇Pre⦈ \;C_1 \;⦇\phi_2⦈\; C_2 \;⦇\phi_3⦈\; \cdots \;⦇\phi_{n-1}⦈\; C_n \;⦇Post⦈ \]
The idea is to work backwards from the end, filling in the most appropriate weakest preconditions, until a full chain of valid Hoare triples can be built from the start to the end, via the following rules:
Assignment Rule.
\(⦇Post[x/E]⦈\) x = E \(⦇Post⦈\)
Ex. ⦇x + y = 10⦈ x = x + y ⦇x = 10⦈
The assignment rule finds the weakest precondition. We can also always strengthen the precondition, or weaken the postcondition, with no problems.
Implication Rule.
\(Pre \implies P \;\land\; ⦇P⦈\) Program \(⦇Q⦈ \;\land\; Q \implies Post\)
-> \(⦇Pre⦈\) Program \(⦇Post⦈\)
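Ex. since y = 0 implies y + 1 = 1, ⦇y + 1 = 1⦈ x = y + 1 ⦇x = 1⦈ holds by the assignment rule, and x = 1 implies x > 0, so we get ⦇y = 0⦈ x = y + 1 ⦇x > 0⦈.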
Composition Rule.
\(⦇Pre⦈\) C1 \(⦇Mid⦈ \;\land\; ⦇Mid⦈\) C2 \(⦇Post⦈\)
-> \(⦇Pre⦈\) C1; C2 \(⦇Post⦈\)
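Ex. from ⦇y = 2⦈ x = y ⦇x = 2⦈ and ⦇x = 2⦈ z = x + 1 ⦇z = 3⦈ we get ⦇y = 2⦈ x = y; z = x + 1 ⦇z = 3⦈.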
Conditional Rule.
\(⦇Pre \land B⦈\) C1 \(⦇Post⦈ \;\land\; ⦇Pre \land \lnot B⦈\) C2 \(⦇Post⦈\)
-> \(⦇Pre⦈\) if B then { C1 } else { C2 } \(⦇Post⦈\)
Ex.
if (x < y) then {
⦇\(\top \land x < y\)⦈
⦇\(x \leq x \land x \leq y\)⦈
z = x
⦇\(z \leq x \land z \leq y\)⦈
} else {
⦇\(\top \land \lnot (x < y )\)⦈
⦇\(y \leq y \land y \leq x\)⦈
z = y
⦇\(z \leq x \land z \leq y\)⦈
}
⦇\(z \leq x \land z \leq y\)⦈

Loop Rule.
\(⦇B \land L⦈\) C \(⦇L⦈\)
-> \(⦇L⦈\) while B { C } \(⦇\lnot B \land L⦈\)
Where \(L\) is a loop invariant: a first order logic statement that holds true throughout the loop. Ex.
while (x > 0) {
⦇\(x \geq 0 \land x > 0\)⦈
⦇\(x-1 \geq 0\)⦈
x = x - 1
⦇\(x \geq 0\)⦈
}
⦇\(\lnot (x > 0) \land x \geq 0 \)⦈
⦇\(x = 0 \)⦈

The loop invariant used is \(x \geq 0\). When picking a loop invariant, pick something "useful", whatever that means.